Improved minimax predictive densities under Kullback–Leibler loss

Authors
Abstract


Similar articles

Improved Minimax Predictive Densities under Kullback – Leibler Loss

Let X|μ∼Np(μ,vxI ) and Y |μ∼Np(μ,vyI ) be independent p-dimensional multivariate normal vectors with common unknown mean μ. Based on only observing X = x, we consider the problem of obtaining a predictive density p̂(y|x) for Y that is close to p(y|μ) as measured by expected Kullback–Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density p̂U(y|x) under the unif...
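As an illustrative sketch of the setup (my own illustration, not text from the paper): the formal Bayes predictive density under the uniform prior is the normal density N_p(x, (v_x + v_y)I), and its Kullback–Leibler loss against the true density N_p(μ, v_y I) has a closed form for spherical normals. A minimal numpy sketch, with μ, x, v_x and v_y chosen arbitrarily:

```python
import numpy as np

def kl_normal_spherical(m1, s1, m2, s2, p):
    """Closed-form KL( N_p(m1, s1*I) || N_p(m2, s2*I) ) for spherical covariances."""
    return 0.5 * (p * (s1 / s2 - 1.0 + np.log(s2 / s1))
                  + np.dot(m1 - m2, m1 - m2) / s2)

p, vx, vy = 5, 1.0, 2.0
mu = np.zeros(p)        # true (unknown) mean, fixed here for illustration
x = np.ones(p)          # a hypothetical observation X = x

# Formal Bayes predictive density under the uniform prior:
#   p_U(y | x) = N_p(x, (vx + vy) I)
loss = kl_normal_spherical(mu, vy, x, vx + vy, p)
print(loss)
```

The expected value of this loss over the distribution of X is the KL risk that the paper's improved predictive densities reduce.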


Asymptotically minimax Bayes predictive densities

The Kullback–Leibler risk ∫ fθ log(fθ/f̂) is used to examine various ways of choosing prior distributions; the principal type of choice studied is minimax. We seek asymptotically least favorable predictive densities for which the corresponding asymptotic risk is minimax. A result resembling Stein’s paradox for estimating normal means by maximum likelihood holds for the uniform prior in the multivariate location family...


Improved Minimax Prediction Under Kullback-Leibler Loss

Let X | μ ∼ Np(μ, vxI) and Y | μ ∼ Np(μ, vyI) be independent p-dimensional multivariate normal vectors with common unknown mean μ, and let p(x|μ) and p(y |μ) denote the conditional densities of X and Y . Based on only observing X = x, we consider the problem of obtaining a predictive distribution p̂(y |x) for Y that is close to p(y |μ) as measured by Kullback-Leibler loss. The natural straw man ...


Variational Minimax Estimation of Discrete Distributions under KL Loss

We develop a family of upper and lower bounds on the worst-case expected KL loss for estimating a discrete distribution on a finite number m of points, given N i.i.d. samples. Our upper bounds are approximation-theoretic, similar to recent bounds for estimating discrete entropy; the lower bounds are Bayesian, based on averages of the KL loss under Dirichlet distributions. The upper bounds are con...
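To make the discrete setting concrete, here is a small sketch (my own illustration, not the paper's bounds): a symmetric Dirichlet(α) prior yields the add-α posterior-mean estimator, and the KL loss of that estimate against the true distribution is straightforward to compute. The choice α = 1/2 below is one common convention, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_alpha_estimate(counts, alpha):
    """Posterior-mean estimate under a symmetric Dirichlet(alpha) prior."""
    c = counts + alpha
    return c / c.sum()

def kl(p, q):
    """KL(p || q) for discrete distributions on the same support."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

m, N = 6, 100
p_true = np.full(m, 1.0 / m)                   # true distribution (uniform here)
samples = rng.choice(m, size=N, p=p_true)
counts = np.bincount(samples, minlength=m)

p_hat = add_alpha_estimate(counts, alpha=0.5)  # add-1/2 smoothing
print(kl(p_true, p_hat))
```

Averaging this loss over repeated samples approximates the expected KL loss whose worst case (over p_true) the paper bounds.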


Minimax estimation of multivariate normal mean under balanced loss function

This paper considers simultaneous estimation of a multivariate normal mean vector using Zellner's (1994) balanced loss function when σ² is known and unknown. We show that the usual estimator X is minimax and obtain a class of minimax estimators which have uniformly smaller risk than the usual estimator X. Also, we obtain the proper Bayes estimator relative to balanced loss function and find the minim...
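For intuition, a hedged sketch of a balanced-type loss: a weighted combination of closeness to a target estimator δ₀ (often the usual estimator X) and closeness to the true mean μ. The exact form used in the paper may differ; this is only an illustration with hypothetical values:

```python
import numpy as np

def balanced_loss(delta, delta0, mu, w):
    """Balanced-type loss: w * ||delta - delta0||^2 + (1 - w) * ||delta - mu||^2."""
    return w * np.sum((delta - delta0) ** 2) + (1 - w) * np.sum((delta - mu) ** 2)

mu = np.zeros(4)                       # true mean, fixed for illustration
x = np.array([1.0, -0.5, 2.0, 0.3])    # hypothetical observation; also the usual estimator
shrunk = 0.8 * x                       # a hypothetical shrinkage estimator

print(balanced_loss(x, x, mu, w=0.5))       # usual estimator: goodness-of-fit term vanishes
print(balanced_loss(shrunk, x, mu, w=0.5))  # shrinkage trades fit for precision
```

The weight w interpolates between pure goodness of fit (w = 1) and pure quadratic estimation loss (w = 0).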



Journal

Journal title: The Annals of Statistics

Year: 2006

ISSN: 0090-5364

DOI: 10.1214/009053606000000155